Analytical Biochemistry
Elsevier BV
Preprints posted in the last 7 days, ranked by how well they match Analytical Biochemistry's content profile, based on 26 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit.
Johansson, J.; Palonen, S.; Egorova, K.; Tuisku, J.; Harju, H.; Kärpijoki, H.; Maaniitty, T.; Saraste, A.; Saari, T.; Tuomola, N.; Rinne, J.; Nuutila, P.; Latva-Rasku, A.; Virtanen, K. A.; Knuuti, J.; Nummenmaa, L.
Background: Quantitative cerebral blood flow (CBF) measured with [15O]water positron emission tomography (PET) is the reference standard for quantifying brain perfusion. However, clinical interpretation of individual CBF measurements is limited by the absence of large normative datasets accounting for physiological variability across the adult lifespan. Long-axial field-of-view PET enables high-sensitivity quantitative [15O]water perfusion imaging without arterial blood sampling, allowing normative characterization of cerebral perfusion at unprecedented scale. The aim of this study was to establish normative and covariate-adjusted models of cerebral blood flow across the adult lifespan using total-body [15O]water PET. Methods: Quantitative CBF measurements were obtained in 302 neurologically healthy adults (age 21-86 years) using total-body [15O]water PET. Linear mixed-effects models were used to evaluate the effects of age, sex, body mass index (BMI), and blood hemoglobin concentration on CBF and to generate normative prediction models across the adult lifespan. Between-subject and within-subject variability were estimated from repeated scans in a subset of participants (n=51). Results: Mean grey matter CBF was 46.1 mL/(min·dL), with substantial inter-individual variability but high within-subject reproducibility (intraclass correlation coefficients 0.78-0.89). Advancing age was associated with a decline in CBF of approximately 7% per decade (p_FDR < 10⁻¹²). Higher BMI was associated with lower CBF (approximately -6% per 10 kg/m²; p_FDR < 0.01). Women exhibited higher CBF than men (approximately 7.5%), but this difference was largely explained by lower blood hemoglobin concentration in women. Covariate-adjusted models were used to generate normative predictions and prediction intervals describing expected CBF across adulthood.
Conclusion: This study establishes a normative database of quantitative cerebral blood flow across the adult lifespan using high-sensitivity [15O]water PET. Age, BMI, and hemoglobin are major determinants of inter-individual variability in CBF. The resulting generative models provide a quantitative reference framework for interpreting cerebral perfusion measurements and may enable automated detection of abnormal brain perfusion in clinical PET imaging.
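The covariate-adjusted normative approach can be sketched numerically. A minimal illustration: only the ~7%-per-decade age effect, the ~6%-per-10-kg/m² BMI effect, and the 46.1 mL/(min·dL) mean are taken from the abstract; the reference covariate values and residual SD are invented for the example.

```python
REF_CBF = 46.1                   # mean grey-matter CBF from the abstract, mL/(min·dL)
REF_AGE, REF_BMI = 50.0, 25.0    # hypothetical reference covariate values

def predicted_cbf(age, bmi):
    """Normative prediction: multiplicative ~7% decline per decade of age
    and ~6% decline per 10 kg/m^2 of BMI (illustrative coefficients)."""
    return REF_CBF * 0.93 ** ((age - REF_AGE) / 10) * 0.94 ** ((bmi - REF_BMI) / 10)

def cbf_deviation(observed, age, bmi, resid_sd=6.0):
    """Deviation of an observed CBF from its normative prediction,
    expressed in units of an assumed residual SD (a z-score)."""
    return (observed - predicted_cbf(age, bmi)) / resid_sd

# An older, higher-BMI subject scanning below the covariate-adjusted norm
z = cbf_deviation(observed=35.0, age=70, bmi=30)   # z < 0: below expectation
```

A clinical tool built on such a model would flag scans whose z-scores fall outside the normative prediction interval.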
Gangolli, M.; Perkins, N. J.; Marinelli, L.; Basser, P. J.; Avram, A. V.
Background: Mild traumatic brain injury (mTBI) is a signature injury in civilian and military populations that remains invisible to detection by conventional radiological methods. Diffusion MRI has been identified as a potential clinical tool for revealing subtle microstructural alterations associated with mTBI. Objective: This study evaluates whether a comprehensive and powerful diffusion MRI (dMRI) technique called mean apparent propagator (MAP) MRI can detect sequelae of mTBI. Methods: We analyzed data from 417 participants of the GE/NFL prospective mTBI study, which included 143 matched controls (mean age, 21.9 ± 8.3 years; 76 women) and 274 patients with acute mTBI and GCS ≥ 13 (mean age, 21.9 ± 8.5 years; 131 women). All participants underwent MRI exams at up to four visits including structural high-resolution T1W, T2W, FLAIR-T2W, and dMRI, in addition to clinical assessments of post-concussive physical symptoms (RPQ-3), psychosocial functioning and lifestyle symptoms (RPQ-13), and postural stability (BESS). The dMRI data for each subject were co-registered across all visits and analyzed using the MAP-MRI framework to measure and map the distribution of net microscopic displacements of diffusing water molecules in tissue and ultimately compute the microstructural MAP-MRI tissue parameters including propagator anisotropy (PA), non-Gaussianity (NG), return-to-origin probability (RTOP), return-to-axis probability (RTAP), and return-to-plane probability (RTPP). We quantified voxel-wise and region-of-interest (ROI)-based changes in these parameters across all four visits. Results: MAP-MRI parameter values were within the expected ranges and showed relatively little variation across visits. We found no significant differences in the longitudinal trajectories of these parameters between mTBI patients and controls.
At acute post-injury timepoints, RPQ-3 and RPQ-13 scores were increased in mTBI patients relative to controls, while BESS scores were not significantly different between groups. Analysis of dMRI metrics and clinical mTBI markers showed significant correspondence between MAP-MRI metrics in cortical gray matter, caudate and pallidum and BESS scores. Conclusion: We developed and tested a state-of-the-art quantitative image processing pipeline for sensitive analysis and detection of subtle tissue changes in longitudinal clinical diffusion MRI data. The absence of a significant statistical difference between populations in the dMRI parameters in this study suggests that the mTBI corresponded to acute post-injury clinical symptoms but that the injury was not severe enough to cause detectable microstructural damage/alterations, and that increased diffusion sensitization combined with improved analysis techniques may be needed. Clinical Impact: These findings suggest that acute mTBI (GCS ≥ 13) may not be detectable with diffusion MRI. Trial Registration: ClinicalTrials.gov NCT02556177
Chihara, A.; Mizuno, R.; Kagawa, N.; Takayama, A.; Okumura, A.; Suzuki, M.; Shibata, Y.; Mochii, M.; Ohuchi, H.; Sato, K.; Suzuki, K.-i. T.
Fluorescent in situ hybridization (FISH) enables highly sensitive, high-resolution detection of gene transcripts. Moreover, by employing multiple probes, this technique allows for multiplexed, simultaneous detection of distinct gene expression patterns spatiotemporally, making it a valuable spatial transcriptomics approach. Owing to these advantages, FISH techniques are rapidly being adopted across diverse areas of basic biology. However, conventional protocols often rely on volatile, toxic reagents such as formalin or methanol, posing potential health risks to researchers. Here, we present a safer protocol that replaces these chemicals with low-toxicity alternatives, without compromising the high detection sensitivity of FISH. We validated this protocol using both in situ hybridization chain reaction (HCR) and signal amplification by exchange reaction (SABER)-FISH in frozen sections of various model organisms, including mouse (Mus musculus), amphibians (Xenopus laevis and Pleurodeles waltl), and medaka (Oryzias latipes). Our results demonstrate successful multiplexed detection of morphogenetic and cell-type marker genes in these model animals using this safer protocol. The protocol has the additional advantage of requiring no proteolytic enzyme treatment, thus preserving tissue integrity. Furthermore, we show that this protocol is fully compatible with EGFP immunostaining, allowing for the simultaneous detection of mRNAs and reporter proteins in transgenic animals. This protocol retains the benefits of highly sensitive, multiplexed, and multimodal detection afforded by integrating in situ HCR and SABER-FISH with immunohistochemistry, while providing a safer option for researchers, thereby offering a valuable tool for basic biology.
Sivakumar, E.; Anand, A.
Computer vision and deep learning techniques, including convolutional neural networks (CNNs) and transformers, have increased the performance of medical image classification systems. However, training deep learning models using medical images is a challenging task that necessitates a substantial amount of annotated data. In this paper, we implement data augmentation strategies to tackle dataset imbalance in the VinDr-SpineXR dataset, which has a lower number of spine abnormality X-ray images compared to normal spine X-ray images. Geometric transformations and synthetic image generation using Generative Adversarial Networks are explored and applied to the abnormal classes of the dataset, and classifier performance is validated using VGG-16 and InceptionNet to identify the most effective augmentation technique. Additionally, we introduce a hybrid augmentation technique that addresses class imbalance, reduces computational overhead relative to a GAN-only approach, and achieves ~99% validation accuracy with both classifiers across all three case studies. Keywords: Data augmentation, Generative Adversarial Network, VGG-16, InceptionNet, Class imbalance, Computer vision, Spine X-ray, Radiology.
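The geometric-transformation arm of the augmentation strategy can be illustrated without any imaging libraries. A minimal sketch, in which the pixel grids, transform set, and oversampling scheme are all simplified assumptions rather than the paper's actual pipeline:

```python
def hflip(img):
    """Left-right mirror of a 2D image stored as a list of rows."""
    return [row[::-1] for row in img]

def rot90(img):
    """Rotate the image 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def augment_minority(images):
    """Enlarge an under-represented class by appending transformed
    copies of each image (here: one flip and one rotation each)."""
    out = list(images)
    for img in images:
        out.append(hflip(img))
        out.append(rot90(img))
    return out

abnormal = [[[1, 0],
             [0, 0]]]                    # one tiny 2x2 "abnormal" image
augmented = augment_minority(abnormal)   # class tripled: 3 images from 1
```

A hybrid scheme like the one described would mix such cheap label-preserving transforms with a smaller number of GAN-synthesized images, trading generation cost against sample diversity.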
Harikumar, A.; Baker, B.; Amen, D.; Keator, D.; Calhoun, V. D.
Single photon emission computed tomography (SPECT) is a highly specialized imaging modality that enables measurement of regional cerebral perfusion and, in particular, resting cerebral blood flow (rCBF). Recent technological advances have improved SPECT quantification and reliability, making it increasingly useful for studying rCBF abnormalities and perfusion network alterations in psychiatric and neurological disorders. To characterize large-scale functional organization in SPECT data, data-driven decomposition methods such as independent component analysis (ICA) have been used to extract covarying perfusion patterns that map onto interpretable brain networks. Blind ICA provides a data-driven approach to estimate these networks without strong prior assumptions. More recently, a hybrid approach that leverages spatial priors to guide spatially constrained ICA (sc-ICA) has been used to fully automate the ICA analysis while also providing participant-specific network estimates. While this has been reliably demonstrated in fMRI with the NeuroMark template, there is currently no comparable SPECT template. A SPECT template would enable automatic estimation of functional SPECT networks with participant-specific expressions that correspond across participants and studies. The current study introduces a new replicable NeuroMark SPECT template for estimating canonical perfusion covariance patterns (networks). We first identify replicable SPECT networks using blind ICA applied to two large-sample SPECT datasets. We then demonstrate the use of the resulting template by applying sc-ICA to an independent schizophrenia dataset. In sum, this work presents and shares the first NeuroMark SPECT template and demonstrates its utility in an independent cohort, providing a scalable and robust framework for network-based analyses.
Xu, M.; Philips, R.; Singavarapu, A.; Zheng, M.; Martin, D.; Nikolin, S.; Mutz, J.; Becker, A.; Firenze, R.; Tsai, L.-H.
Background: Gamma oscillation dysfunction has been implicated in neuropsychiatric disorders. Restoring gamma oscillations via brain stimulation represents an emerging therapeutic approach. However, the strength of its clinical effects and treatment moderators remain unclear. Method: We conducted a systematic review and meta-analysis to examine the clinical effects of gamma neuromodulation in neuropsychiatric disorders. A literature search for controlled trials using gamma stimulation was performed across five databases up until April 2025. Effect sizes were calculated using Hedges' g. Separate analyses using the random-effects model examined the clinical effects in schizophrenia (SZ), major depressive disorder (MDD), bipolar disorder, and autism spectrum disorder. For SZ and MDD, subgroup analyses evaluated the effects of stimulation modality, stimulation frequency, treatment duration, and pulses per session. Result: Fifty-six studies met the inclusion criteria (N_SZ = 943, N_MDD = 916, N_BD = 175, N_ASD = 232). In SZ, gamma stimulation was associated with improvements in positive (k = 10, g = -0.60, p < 0.001), negative (k = 12, g = -0.37, p = 0.03), depressive (k = 8, g = -0.39, p < 0.001), and anxious symptoms (k = 5, g = -0.59, p < 0.001), and overall cognitive function (k = 7, g = 0.55, p < 0.001). Stimulation frequency and treatment duration moderated therapeutic effects. In MDD, reductions in depressive symptoms were observed (k = 23, g = -0.34, p = 0.007). Conclusion: Gamma neuromodulation showed moderate therapeutic benefits in SZ and MDD. Substantial heterogeneity likely reflects protocol differences, highlighting the need for well-powered future trials.
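The effect-size metric used throughout the meta-analysis can be computed directly. A minimal sketch of Hedges' g; the group statistics below are invented for illustration, not values from any included trial.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with the small-sample (Hedges) correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                  # Cohen's d on the pooled SD
    j = 1 - 3 / (4 * (n1 + n2) - 9)    # small-sample correction factor J
    return d * j

# Hypothetical symptom scores: active gamma stimulation vs. sham control;
# a negative g means symptom reduction, matching the sign convention above
g = hedges_g(m1=12.0, sd1=4.0, n1=30, m2=15.0, sd2=4.0, n2=30)
```

Per-study g values like this one are then pooled under a random-effects model to produce the k-study summary estimates reported in the abstract.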
Jacobsen, A. M.; Quednow, B. B.; Bavato, F.
Importance: Blood neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) are entering clinical use in neurology as markers of neuroaxonal and astrocytic injury, but their utility in psychiatry is unclear. Objective: To determine whether psychiatric diagnoses are associated with altered plasma NfL and GFAP levels. Design, Setting, and Participants: This population-based study examined plasma NfL and GFAP among 47,495 participants from the UK Biobank (54.0% female; 93.5% White; mean [SD] age 56.8 [8.2] years) who provided blood samples and sociodemographic and clinical data between 2006 and 2010. Normative modeling was applied to assess associations between 7 lifetime psychiatric diagnostic categories and deviations from expected NfL and GFAP levels, while accounting for neurological diagnoses, cardiometabolic burden, and substance use. Data were analyzed between July 2025 and March 2026. Main Outcomes and Measures: Deviations in plasma NfL and GFAP levels from normative predictions. Results: Relative to the reference population, plasma NfL levels were higher among individuals with bipolar disorder (d=0.20; 95% CI, 0.03-0.37; p=0.03), recurrent depressive disorder (d=0.23; 95% CI, 0.07-0.38; p=0.009), and depressive episodes (d=0.06; 95% CI, 0.02-0.10; p=0.01), lower among individuals with anxiety disorders (d=-0.07; 95% CI, -0.12 to -0.02; p=0.008), but did not differ in schizophrenia spectrum, stress-related, or other psychiatric disorders. Plasma GFAP levels were not elevated in any psychiatric disorders. Variability in NfL levels was greater among individuals with schizophrenia spectrum disorders (variance ratio [VR]=1.30; p=0.005), depressive episodes (VR=1.06; p=0.006), and anxiety disorders (VR=1.08; p=0.005). Variability in GFAP levels was increased only in anxiety disorders (VR=1.08; p=0.01).
Plasma NfL levels exceeding percentile-based normative thresholds were more common among individuals with schizophrenia spectrum disorders, bipolar disorder, recurrent depressive disorder, and depressive episodes. Neurological diagnoses, cardiometabolic burden, and substance use were associated with plasma NfL and GFAP levels. Conclusions and Relevance: This study provides population-level evidence of plasma NfL elevation in bipolar and depressive disorders and increased variability in schizophrenia spectrum, bipolar and depressive disorders, supporting its potential as a biomarker in psychiatry and informing its ongoing neurological applications. Plasma GFAP levels, in contrast, were largely unaltered across psychiatric disorders. Key Points: Question: Are plasma neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) levels altered in psychiatric disorders? Findings: In this cohort study including 47,495 individuals, normative modeling revealed that plasma NfL levels were elevated in bipolar and depressive disorders, whereas plasma GFAP levels were not elevated in any psychiatric disorder. Plasma NfL levels also showed higher variability in schizophrenia spectrum, bipolar, and depressive disorders. Meaning: Plasma NfL shows distinct alterations in schizophrenia spectrum and affective disorders, supporting its further investigation as a biomarker in clinical psychiatry and highlighting the need to consider psychiatric comorbidity in neurological applications.
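The variance-ratio (VR) comparisons reported above have a simple form. A sketch using invented biomarker values (not UK Biobank data):

```python
from statistics import variance

def variance_ratio(clinical, reference):
    """Ratio of sample variances; VR > 1 means the clinical group is more
    heterogeneous than the reference population for the biomarker."""
    return variance(clinical) / variance(reference)

# Hypothetical plasma NfL levels (pg/mL) in a clinical and a reference group
vr = variance_ratio([8.0, 12.0, 15.0, 22.0, 30.0],
                    [10.0, 12.0, 14.0, 16.0, 18.0])
```

In the study itself, variances are compared after normative adjustment, so a VR above 1 reflects residual heterogeneity beyond what the covariates explain.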
Quide, Y.; Lim, T. E.; Gustin, S. M.
Background: Early-life adversity (ELA) is a risk factor for enduring pain in youth and is associated with alterations in brain morphology and function. However, it remains unclear whether ELA-related neurobiological changes contribute to the development of enduring pain in early adolescence. Methods: Using data from the Adolescent Brain Cognitive Development (ABCD) Study, we examined multimodal magnetic resonance imaging (MRI) markers in children assessed at baseline (ages 9-11 years) and at 2-year follow-up (ages 11-13 years). ELA exposure was defined at baseline to maximise temporal separation between early adversity and later enduring pain. Participants with enduring pain at follow-up (n = 322) were compared to matched pain-free controls (n = 644). Structural MRI, diffusion MRI (fractional anisotropy, mean diffusivity), and resting-state functional connectivity data were analysed. Linear models tested main effects of enduring pain, ELA, and their interaction on brain metrics, controlling for relevant covariates. Results: ELA exposure was associated with smaller caudate and nucleus accumbens volumes, and reduced surface area of the left rostral middle frontal gyrus. No significant effects of enduring pain or ELA-by-enduring pain interaction were observed across grey matter, white matter, or functional connectivity measures. Conclusions: ELA was associated with alterations in fronto-striatal regions in late childhood, but these changes were not linked to enduring pain in early adolescence. These findings suggest that ELA-related neurobiological alterations may represent early markers of vulnerability rather than concurrent correlates of enduring pain. Longitudinal follow-up is needed to determine whether these alterations contribute to later chronic pain risk.
Spann, D. J.; Hall, L. M.; Moussa-Tooks, A.; Sheffield, J. M.
Background: Negative symptoms are core features of schizophrenia that relate strongly to functional impairment, yet interventions targeting these symptoms remain largely ineffective. Emerging theoretical work highlights how environmental factors may shape and maintain negative symptoms. Although racial disparities in schizophrenia diagnosis among Black Americans are well documented and linked to racial stress and psychosis, the impact of racial stress on negative symptoms has not been examined. This study provides an initial test of a novel theory proposing that racial stress - here measured by racial discrimination - influences negative symptom severity through exacerbation of negative cognitions about the self, particularly defeatist performance beliefs (DPB). Study Design: Participants diagnosed with schizophrenia-spectrum disorder (SSD) (N = 208; 80 Black, 128 White) completed the Positive and Negative Syndrome Scale (PANSS), the Defeatist Beliefs Scale, and self-report measures of subjective racial and ethnic discrimination (Racial and Ethnic Minority Scale and General Ethnic Discrimination Scale). Relationships among variables were tested using linear regression and mediation analysis. Study Results: Black participants exhibited significantly greater total and experiential negative symptoms than White participants, with no group difference in DPB. Racial discrimination explained 46% of the relationship between race and negative symptoms. Among Black participants, higher DPB were associated with greater negative symptom severity. Discrimination was positively related to both DPB and negative symptoms. DPB partially mediated the relationship between discrimination and negative symptoms. Conclusions: Findings suggest that racial stress contributes to negative symptom severity via defeatist beliefs among Black individuals, highlighting potential targets for culturally informed interventions.
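The mediation logic (discrimination → defeatist beliefs → negative symptoms) reduces to a product-of-coefficients calculation. A sketch with hypothetical standardized path coefficients, not the study's fitted values:

```python
def proportion_mediated(a, b, c_total):
    """Share of a total effect carried through a mediator:
    a = predictor -> mediator path, b = mediator -> outcome path,
    c_total = total predictor -> outcome effect."""
    return (a * b) / c_total

# Hypothetical paths: discrimination -> DPB (a), DPB -> negative symptoms (b),
# against a total discrimination -> negative-symptom effect (c_total)
pm = proportion_mediated(a=0.55, b=0.42, c_total=0.50)
```

A value between 0 and 1, as here, corresponds to the partial mediation the study reports; formal inference would add bootstrap confidence intervals around the indirect effect a*b.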
Xu, J.; Parker, R. M. A.; Bowman, K.; Clayton, G. L.; Lawlor, D. A.
Background: Higher levels of sedentary behaviour, such as leisure screen time (LST), and lower levels of physical activity are associated with diseases across multiple body systems which contribute to a large global health burden. Whether these associations are causal is unclear. The primary aim of this study is to investigate the causal effects of higher LST (given greater power) and, secondarily, lower moderate-to-vigorous intensity physical activity (MVPA), on a wide range of diseases in a hypothesis-free approach. Methods: A two-sample Mendelian randomisation phenome-wide association study was conducted for the main analyses. Genetic single nucleotide polymorphisms (SNPs) were first selected as exposure genetic instruments for LST (hours of television watched per day; 117 SNPs) and MVPA (higher vs. lower; 18 SNPs) based on the genome-wide significance threshold (p < 5×10⁻⁸) from the largest relevant genome-wide association study (GWAS). For disease outcomes, we used summary results from FinnGen GWAS, including 1,719 diseases defined by hospital discharge International Classification of Diseases (ICD) codes in 453,733 European participants. For the main analyses, we used the inverse-variance weighting method with a Bonferroni-corrected p-value of p ≤ 3.47×10⁻⁴. Sensitivity analyses included Steiger filtering, MR-Egger and weighted median analyses, and data from UK Biobank were used to explore replication. Findings: Genetically predicted higher LST was associated with increased risk of 87 (5.1% of the 1,719) diseases. Most of these diseases were in musculoskeletal and connective tissue (n=37), genitourinary (n=12) and respiratory (n=8) systems. Genetic liability to lower MVPA was associated with six diseases: three in musculoskeletal and connective tissue and genitourinary systems (with greater risk of these diseases also identified with higher LST), and three in respiratory and genitourinary systems. Sensitivity analyses largely supported the main analyses.
Results replicated in UK Biobank where data were available. Conclusions: Higher levels of sedentary behaviour, and lower levels of physical activity, causally increase the risk of diseases across multiple body systems, making them promising targets for reducing multimorbidity.
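The main IVW estimator combines per-SNP Wald ratios weighted by their inverse variance. A minimal sketch; the summary statistics below are invented, not drawn from the LST or MVPA GWAS.

```python
def ivw_estimate(beta_exp, beta_out, se_out):
    """Inverse-variance-weighted MR estimate: each SNP contributes the
    Wald ratio beta_out/beta_exp, weighted by (beta_exp/se_out)^2
    (the first-order inverse variance of that ratio)."""
    num = den = 0.0
    for bx, by, se in zip(beta_exp, beta_out, se_out):
        w = (bx / se) ** 2          # inverse variance of the Wald ratio
        num += w * (by / bx)        # weighted Wald ratio
        den += w
    return num / den

# Three hypothetical SNPs instrumenting leisure screen time, with
# per-SNP effects on the exposure, the outcome, and the outcome SE
est = ivw_estimate([0.10, 0.20, 0.15], [0.05, 0.12, 0.09], [0.01, 0.01, 0.01])
```

The sensitivity estimators mentioned (MR-Egger, weighted median) reuse the same per-SNP ratios but weight or intercept them differently to probe pleiotropy.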
Pietilainen, O.; Salonsalmi, A.; Rahkonen, O.; Lahelma, E.; Lallukka, T.
Objectives: Longer lifespans lead to more years spent in retirement, despite efforts to raise the retirement age. It is therefore important to study how the retirement years can be spent without disease. This study examined socioeconomic and sociodemographic differences in healthy years spent in retirement. Methods: We followed a cohort of retired Finnish municipal employees (N=4231, average follow-up 15.4 years) in national administrative registers for major chronic diseases: cancer, coronary heart disease, cerebrovascular disease, diabetes, asthma or chronic obstructive pulmonary disease, dementia, mental disorders, and alcohol-related disorders. Median healthy years in retirement and age at first occurrence of illness (ICD-10 and ATC-based) in each combination of sex, occupational class, and age of retirement were predicted using Royston-Parmar models. Prevalence rates for each diagnostic group were calculated. Results: The most healthy years in retirement were spent by women who had worked in semi-professional jobs and retired at age 60-62 (median predicted healthy years 11.6, 95% CI 10.4-12.7). The fewest healthy years in retirement were spent by men who had worked in routine non-manual jobs and retired after age 62 (median predicted healthy years 6.5, 95% CI 4.4-9.5). Diabetes was slightly more common among lower occupational class women, and dementia among women in manual occupations who retired at age 60-62. Discussion: Healthy years in retirement are not enjoyed equally by women and men, nor by those who retire earlier or later. Policies aiming to increase the retirement age should consider the effects of these gaps on retirees and the equitability of those effects.
Hung, J.; Smith, A.
The global ambition to end the human immunodeficiency virus (HIV) epidemic requires understanding which system-level policy levers, enacted under the framework of Universal Health Coverage (UHC), are most effective in achieving both transmission reduction and diagnostic coverage. This study addresses an important evidence gap by quantifying the within-country association between measurable UHC policy indicators and the estimated rate of new HIV infections across nine Southeast Asian countries between 2013 and 2022. Employing a Fixed-Effects panel data methodology, the analysis controls for time-invariant national heterogeneity, ensuring reliable estimates of policy impact. We found that marginal changes in total current health expenditure (CHE) as a percentage of gross domestic product (GDP) were not statistically significantly associated with changes in HIV incidence. However, increases in the UHC Infectious Disease Service Coverage Index were statistically significantly associated with concurrent reductions in HIV incidence (p < 0.001), suggesting the efficacy of targeted service implementation as the principal driver of curbing new HIV infections. In addition, the UHC Reproductive, Maternal, Newborn, and Child Health Service Coverage Index exhibited a statistically significant positive association with changes in HIV incidence (p < 0.01), which is interpreted as a vital surveillance artefact resulting from expanded detection and reporting of previously undiagnosed HIV cases. Furthermore, out-of-pocket (OOP) health expenditure as a percentage of CHE showed a counter-intuitive negative association with changes in HIV incidence (p < 0.01), suggesting this metric primarily shows ongoing indirect cost burdens on the established patient cohort, or, alternatively, presents a diagnostic access barrier that results in lower case finding. 
These findings suggest that policymakers should prioritise investment in targeted infectious disease service efficacy over aggregate fiscal commitment and utilise integrated sexual health platforms for strengthened HIV surveillance and case identification.
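The fixed-effects design controls time-invariant national heterogeneity via the "within" (demeaning) transformation. A minimal sketch on an invented incidence panel, not the study's data:

```python
from statistics import mean

def within_transform(panel):
    """Subtract each country's own mean from its series, removing any
    time-invariant country effect before estimating policy slopes."""
    return {country: [y - mean(series) for y in series]
            for country, series in panel.items()}

# Hypothetical HIV incidence per 1,000 population, three years per country
panel = {"A": [0.30, 0.28, 0.26], "B": [0.90, 0.85, 0.80]}
demeaned = within_transform(panel)
# each demeaned series now sums to (numerically) zero
```

Regressing demeaned incidence on demeaned policy indicators is equivalent to the fixed-effects estimator: only within-country variation over time identifies the coefficients.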
Hassan, S. S.; Nordqvist-Kleppe, S.; Asinger, N.; Wang, J.; Dillner, J.; Arroyo Muhr, L. S.
Human papillomavirus (HPV) testing is the primary method for cervical cancer screening, and a negative HPV test is associated with a very low subsequent risk of invasive cancer. Nevertheless, a small number of cervical cancers are diagnosed following an HPV-negative testing result, posing challenges within HPV-based screening pathways. Using nationwide Swedish registry data of HPV testing, we identified women diagnosed with invasive cervical cancer between 2019 and 2024 and reconstructed HPV testing histories from the National Cervical Screening Registry (NKCx). The most recent HPV test prior to diagnosis was defined as the index test, and longitudinal HPV testing trajectories were classified among women with an HPV-negative index test. Of 3,000 women diagnosed with invasive cancer, 243 (8.1%) had an HPV-negative index test. These women were older at diagnosis and more frequently diagnosed at advanced stages compared with women with an HPV-positive index test. Most HPV-negative index tests (66.3%) were performed in the peri-diagnostic period (±30 days). Among women with an HPV-negative index test, 52.7% (128/243) had no prior HPV testing recorded, while the remainder had consistently HPV-negative histories (33.3%, 83/243) or evidence of prior HPV positivity before the index negative test (14%, 32/243). Possible recurrent HPV positivity following an intervening negative test was rare (0.4%, 1/243). HPV-negative screening results preceding invasive cancer reflect heterogeneous screening histories and cannot be explained solely by test failure. These findings highlight the importance of reaching women earlier in screening programs and show that fluctuating HPV detectability is rare.
Xiao, M.; Girard, Q.; Pender, M.; Rabezara, J. Y.; Rahary, P.; Randrianarisoa, S.; Rasambainarivo, F.; Rasolofoniaina, O.; Soarimalala, V.; Janko, M. M.; Nunn, C. L.
Purpose: Antibiotic use (ABU) is a major driver of antimicrobial resistance (AMR), but ABU patterns are poorly understood in low-income countries where the burden of AMR is great and ABU is insufficiently regulated. Here, we report ABU from ten sites ranging from rural villages to small cities in Madagascar, a country with high AMR levels, and present results from modeling to identify factors that may be associated with ABU in this setting. Methods: We conducted surveys of 290 individuals from ten sites in the SAVA Region of northeast Madagascar to gather data on sociodemographic characteristics, agricultural and animal husbandry practices, recent antibiotic use, the antibiotics that participants recalled using in their lifetimes, and the sources of their antibiotics. Using these data, we conducted statistical analyses with a mixed-effects logistic model to determine which characteristics were associated with recent antibiotic use. Results: Nearly all respondents (N=283, 97.6%) reported ABU in their lifetimes, with amoxicillin being the most widely reported antibiotic (N=255, 90.1% of those reporting ABU). All recalled antibiotics were classified as frontline drugs except for ciprofloxacin. Most respondents who reported antibiotic use also reported obtaining antibiotics without prescriptions from local stores (N=273, 96.5%), while only 52.3% (N=148) reported obtaining antibiotics through a prescriptive route, such as from a health clinic or private doctor. Of the 127 individuals (44.9%) who reported recent ABU, men were found to be significantly less likely to have recently taken antibiotics than women. Conclusions: Our findings provide new insights into ABU in agricultural settings in low-income countries, which have historically been understudied in AMR and pharmacoepidemiologic research. Knowledge of ABU patterns supports understanding of AMR dynamics and AMR control efforts in these contexts, such as interventions on inappropriate antibiotic dispensing.
Key points:
- Antibiotic use (ABU) in Madagascar is largely unstudied despite its role in antimicrobial resistance (AMR), of which Madagascar bears a high burden.
- ABU was widespread among livestock owners in northeast Madagascar, with the majority of study participants reporting ABU in their lifetimes and nearly half of those reporting ABU also having taken antibiotics in the previous three months.
- Most respondents reported obtaining their antibiotics from non-pharmaceutical stores, indicating high levels of unregulated ABU, though more than half also reported sourcing their antibiotics through prescriptive means (like doctors and health clinics).
- Men were less likely than women to have taken antibiotics in the previous three months.
- These findings support the development of interventions to mitigate the burden of AMR in Madagascar and similar contexts while underscoring the need for more comprehensive research on the drivers and patterns of ABU.

Plain language summary: In this study, we provide basic information on antibiotic use (ABU) patterns in Madagascar, a country that experiences high levels of resistance but has been particularly understudied in AMR and pharmacological research. We surveyed 290 farmers with livestock from ten sites across northeast Madagascar about their ABU and found that nearly all study participants (N=283, 97.6%) have used antibiotics in their lifetimes, while a little under half of those who reported ABU also reported using antibiotics in the previous three months (N=127, 44.9%). The most used antibiotic was amoxicillin (N=255, 90.1%). Most people obtained their antibiotics from sources that do not require prescriptions, like general stores, indicating that most ABU is unregulated. Through modeling, we also found that men were less likely than women to have taken antibiotics in the previous three months (OR=0.50, CI 0.30-0.82).
These findings help us better understand the dynamics of ABU in low-income countries, which have historically been understudied in AMR and pharmacological research. They also support efforts to mitigate the burden of AMR by revealing ABU dynamics that may contribute to the emergence and spread of AMR, as well as identifying targets for intervention to curb inappropriate ABU.
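The sex effect above is reported as an odds ratio with a confidence interval (OR=0.50, 95% CI 0.30-0.82). As a minimal sketch of how such figures are derived from a (mixed-effects) logistic model, the log-odds coefficient and its standard error are exponentiated into an OR with a Wald interval. The coefficient and standard error below are illustrative values only, chosen to roughly reproduce the reported interval; they are not taken from the paper.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression log-odds coefficient and its
    standard error into an odds ratio with a Wald 95% CI."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative inputs: a negative coefficient for 'male' of about
# -0.693 (log 0.5) with SE 0.26 yields an OR near 0.50 with a CI
# close to the reported 0.30-0.82.
or_, lo, hi = odds_ratio_ci(beta=-0.693, se=0.26)
```

The same transformation applies to any covariate in the model; only the random-effects structure (here, site-level intercepts) changes how the standard error is estimated.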
Polonsky, J.; Hudu, S.; Uthman, K.; Katuala, Y.; Evbuomwan, P. E.; Osman, H. J. O.; Sulaiman, A. K.; Adjaho, I. I.; Doumbia, C. O.; Gignoux, E.; Ale, F.
Show abstract
Background During Nigeria's largest recorded diphtheria outbreak, hospital capacity in Kano State was rapidly overwhelmed. Médecins Sans Frontières introduced home-based care (HBC) for patients with mild disease to prioritise facility-based care for severe cases. We assessed whether HBC was non-inferior to facility-based treatment in terms of mortality, sequelae, and household transmission. Methods We conducted a retrospective matched cohort study. Mild diphtheria cases treated between January 2023 and May 2024 were matched 1:1 by treatment modality (HBC or diphtheria treatment centre [DTC]) on sex, age group, vaccination status, and residence. Conditional logistic regression estimated the association between treatment modality and mortality, with robustness assessed through propensity score weighting, sensitivity analyses, and E-value computation. Findings Of 990 sampled patients, 678 (367 HBC, 311 DTC) were enrolled (68.5%). After adjustment, treatment modality was not independently associated with mortality (HBC vs. DTC: aOR 0.40, 95% CI 0.13-1.30), with similar estimates across sensitivity analyses (E-value 4.40). Clinical complications were the strongest predictor of death (aOR 23.1, 95% CI 1.73-307). Vaccination was protective (aOR 0.28, 95% CI 0.08-0.94) and treatment delay of four or more days increased mortality (aOR 4.15, 95% CI 1.23-14.0). HBC was not associated with increased household transmission or long-term sequelae. Interpretation Vaccination and early treatment, rather than care setting, were the main determinants of survival. When supported by clinical triage and structured follow-up, decentralised care can be used to manage mild cases during diphtheria epidemics in settings with constrained hospital capacity.
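The E-value quoted in the findings quantifies how strong unmeasured confounding would have to be, on the risk-ratio scale, to explain away the observed association. A minimal sketch of the standard E-value formula (E = RR + sqrt(RR * (RR - 1)), inverting protective ratios first) is below; applying it to the reported aOR of 0.40, treated as an approximate risk ratio, gives a value close to the paper's reported 4.40, with the small gap presumably due to rounding of the published OR.

```python
import math

def e_value(rr):
    """E-value for a point estimate on the risk-ratio scale:
    the minimum strength of association an unmeasured confounder
    would need with both exposure and outcome to fully explain
    away the observed association."""
    if rr < 1:              # protective estimates are inverted first
        rr = 1.0 / rr
    return rr + math.sqrt(rr * (rr - 1.0))

# Applied to the reported aOR of 0.40 (OR treated as an
# approximate RR), this is close to the reported E-value of 4.40.
e = e_value(0.40)
```

A null estimate (RR = 1) gives an E-value of 1, meaning no confounding is needed to explain the result.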
Areb, M.; Huybregts, L.; Tamiru, D.; Toure, M.; Biru, B.; Fall, T.; Haddis, A.; Belachew, T.
Show abstract
Background This study aimed to assess caregiver knowledge of Infant and Young Child Feeding (IYCF), child health, severe acute malnutrition (SAM) screening, and Community-Based Management of Acute Malnutrition (CMAM); its determinants; and its associations with IYCF and WaSH (water, sanitation, and hygiene) practices among caregivers of children aged 6-59 months with SAM in Ethiopian agrarian and pastoralist settings. Methods Data were from the baseline survey of the R-SWITCH Ethiopia cluster-randomized controlled trial (cRCT), which screened ~28,000 children aged 6-59 months and identified 686 SAM cases. Caregiver knowledge was evaluated using a validated 32-item questionnaire (with Cronbach's α assessed for internal reliability) and analyzed via linear mixed-effects and Poisson regression models in Stata 17. Results Caregiver knowledge was positively associated with improved IYCF/WaSH practices among children aged 6-23 months with SAM, including higher minimum dietary diversity (MDD: IRR=1.50) and minimum acceptable diet (MAD: IRR=1.63) and reduced zero vegetable/fruit intake (IRR=0.77), as well as with MDD in children aged 24-59 months, improved water access (IRR=1.19), water treatment (IRR=2.02), and handwashing stations (IRR=1.41). Literacy (β = 4.1; 95% CI: 1.5-6.6, p = 0.016), pregnancy (β = 4.4; 95% CI: 0.9-7.8, p = 0.018), having the child weighed at a health post or health center (β = 8.9; 95% CI: 3.5-14.2, p ≤ 0.001), and higher household wealth index (β = 11.8; 95% CI: 3.6-20.1, p = 0.005) were associated with higher knowledge, while possible depression (β = -0.3; 95% CI: -0.5 to 0.0, p = 0.015) was associated with lower knowledge. Conclusion Caregiver knowledge predicts better IYCF/WaSH practices among children aged 6-59 months with SAM. Literacy, pregnancy, having the child weighed at a health post or health center, and greater household wealth were associated with higher caregiver knowledge, whereas possible depression was associated with lower knowledge.
Integrating context-specific caregiver education and mental health support into CMAM, growth monitoring and promotion (GMP), and primary care services could enhance feeding and WaSH practices in Ethiopia.
Heffernan, P. M.; van den Berg, H.; Yadav, R. S.; Murdock, C. C.; Rohr, J. R.
Show abstract
Background Insecticides remain the cornerstone of mosquito vector control for malaria, dengue, and other mosquito-borne diseases, yet global patterns of deployment and their socioeconomic and environmental drivers are poorly characterized. Understanding where and why insecticides are used is essential for better targeting control efforts and ensuring they are effective, equitable, and efficient. Methods We analyzed annual country-level insecticide-use data from 122 countries (1990-2019), reported as standard spray coverage for insecticide-treated nets (ITNs), residual spraying (RS), spatial spraying (SS), and larviciding (LA). Generalized linear mixed models and hurdle models quantified associations between deployment and disease incidence, human development index (HDI), human population density, temperature, and precipitation. Models were evaluated using repeated cross-validation and applied to generate downscaled predictions of insecticide use globally at subnational administrative region level 2 (ADM2). Findings Insecticide deployment increased with malaria and dengue incidence, but this response was substantially stronger in higher-HDI countries, indicating that deployment depends on socioeconomic capacity as well as disease burden, which leads to weaker scaling in lower-resource settings. Intervention types exhibited distinct patterns: ITN use tracked malaria burden, whereas infrastructure-intensive approaches (e.g., RS and SS) were concentrated in higher-HDI settings and increased with Aedes-borne disease incidence. Downscaled ADM2-level maps uncovered substantial within-country heterogeneity that is obscured at the national scale, highlighting regions across sub-Saharan Africa, South Asia, and parts of Latin America where predicted deployment remains low relative to disease risk. Interpretation Global insecticide deployment reflects not only epidemiological need but also economic and logistical capacity, creating mismatches between risk and control.
High-resolution mapping can support more equitable allocation of interventions, guide insecticide resistance stewardship, and improve strategic planning as climate and urbanization reshape mosquito-borne disease risk.
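The hurdle models mentioned in the methods handle data where many country-years report zero insecticide use: one part models whether any deployment occurs, and a second part models the amount given that it occurs, with the unconditional expectation being their product. The sketch below is an illustrative prediction function only, with made-up coefficients and no fitting step; it shows the structure, not the paper's fitted model.

```python
import math

def hurdle_expected(x, beta_zero, beta_pos):
    """Two-part (hurdle) prediction: a logistic part models whether
    any insecticide is deployed, a log-linear part models coverage
    given deployment, and the unconditional expectation is their
    product: E[Y] = P(Y > 0) * E[Y | Y > 0]."""
    eta0 = sum(b * xi for b, xi in zip(beta_zero, x))
    p_any = 1.0 / (1.0 + math.exp(-eta0))       # P(deployment > 0)
    mu_pos = math.exp(sum(b * xi for b, xi in zip(beta_pos, x)))
    return p_any * mu_pos

# With a single intercept covariate and zero coefficients, the
# deployment probability is 0.5 and conditional coverage is 1.
y_hat = hurdle_expected([1.0], [0.0], [0.0])
```

Splitting the model this way lets covariates such as HDI act differently on the decision to deploy at all versus the intensity of deployment.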
Maneraguha, F. K.; Cote, J.; Bourbonnais, A.; Arbour, C.; Chagnon, M.; Hatem, M.
Show abstract
Background Comprehensive sexuality education (CSE) is essential to the health and well-being of young people. In the Democratic Republic of Congo (DRC), where more than 65% of the population is under the age of 25, access to interpersonal CSE remains limited owing to sociocultural and structural barriers, exposing young people to persistent social and health vulnerabilities. In this context, mobile health apps (MHAs) are a promising solution, supported by the growing use of smartphones among young Congolese. However, this group's intention to use MHAs for CSE has received little research attention to date. Objective The aim of this study was to identify predictors of intention to use MHAs among young Congolese, based on the extended Unified Theory of Acceptance and Use of Technology (UTAUT2). Methods A predictive correlational study was conducted in eight public secondary schools in Bukavu (DRC) with a stratified random sample of 859 students. Predictors of intention to use (performance expectancy [PE], effort expectancy [EE], social influence [SI], facilitating conditions [FC], and perceived risk [PR]) and moderators (age, gender, and past MHA experience) were measured using a self-administered UTAUT questionnaire. Descriptive and multivariate analyses were performed in SPSS version 28. Results The mean age of participants was 16.3 years (SD = 1.5), and boys made up 55.1% of the sample. Overall, 51.0% of participants owned a smartphone, of whom 62.3% reported easy access to mobile data and 16.2% were already using MHAs to learn about sexual health. Intention to use MHAs was positively influenced by PE (β = 0.523, p < 0.001), EE (β = 0.115, p < 0.001), and SI (β = 0.113, p < 0.001); FC (p = 0.260) and PR (p = 0.631) had no significant influence. Age moderated all of the relationships tested (F(1, 849-854) = 9.97-20.82; p ≤ 0.002), with more marked effects among younger participants aged 14-15 years.
The final model explained 44% of the variance, indicating good predictive power. Conclusion Intention to use digital CSE was explained primarily by PE, EE, and SI and moderated by age. To strengthen this intention, stakeholders will need to promote e-interventions that are relevant, easy to use, socially valued, and tailored to young people's needs and the local context.
Griffith, B. C.; Iliassu, S.; Mbanga, C.; Ngenge, B. M.; Patel, S.; Graves, J. C.; Singh, N.; Ndoula, S.; Njoh, A. A.; Gisele, E.; Mngemane, S.; Ajayi, T.; Zultak, L. A.; Saidu, Y.
Show abstract
Cameroon introduced human papillomavirus vaccine (HPVV) into the routine immunization schedule in October 2020, but by the end of 2022, coverage remained low. To increase coverage, Cameroon switched to a country-wide, gender-neutral vaccination (GNV) approach in 2023, coupled with a revamped delivery strategy consisting of Community Dialogues (CDs) and Periodic Intensification of Routine Immunization (PIRIs) activities in selected health districts (HDs). We assessed the impact of these programmatic changes, notably the GNV approach, on HPVV coverage. This retrospective, cross-sectional study measured the effect of GNV and CDs + PIRIs on HPVV coverage among 9-year-old girls in Cameroon (2022-2023). Data on HPVV coverage from all 203 HDs were extracted from DHIS2, and coverage was calculated at the HD level based on the estimated eligible population of 9-year-old girls. Descriptive statistics and multiple regression models were used to assess the impact of GNV on vaccination coverage while adjusting for CDs + PIRIs and urban/rural status. In 2023, of the 203 HDs, 115 (56.7%) conducted GNV only, 74 (36.5%) implemented GNV & CDs + PIRIs, and 154 (75.9%) were classified as rural. Among age-eligible girls, HPV vaccination coverage rose 39.2 percentage points overall from 2022 to 2023. In multiple linear regression, HPVV coverage increased significantly in HDs with GNV & CDs + PIRIs compared with those with neither GNV nor CDs + PIRIs (β = 55.5 percentage points, 95% CI: 38.7-72.3, p < 0.001), and in HDs with GNV only compared with those with neither (β = 28.7 percentage points, 95% CI: 12.5-45.0, p = 0.001). Overall, the GNV approach significantly increased HPVV coverage for girls, particularly when implemented alongside CDs + PIRIs.
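The regression comparison above uses dummy-coded intervention groups, so each coefficient is the mean coverage difference relative to districts with neither intervention. A minimal sketch of that design with ordinary least squares is below; the six district values are invented toy data, not the study's DHIS2 data, and the recovered coefficients (28 and 54 points) only mimic the pattern of the reported 28.7 and 55.5.

```python
import numpy as np

# Hypothetical district-level coverage values (percentage points)
# for three groups: neither intervention, GNV only, GNV + CDs/PIRIs.
coverage = np.array([10.0, 12.0, 40.0, 38.0, 66.0, 64.0])
gnv_only = np.array([0, 0, 1, 1, 0, 0], dtype=float)
gnv_cds  = np.array([0, 0, 0, 0, 1, 1], dtype=float)

# OLS with an intercept: the dummy coefficients estimate mean
# coverage differences versus the no-intervention group, analogous
# to the study's beta estimates.
X = np.column_stack([np.ones_like(coverage), gnv_only, gnv_cds])
beta, *_ = np.linalg.lstsq(X, coverage, rcond=None)
```

With this saturated dummy design, the intercept equals the no-intervention group mean and each coefficient equals that group's mean difference from it, which is why the study can report its betas directly in percentage points.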
Malingumu, E. E.; Badaga, I.; Kisendi, D. D.; Pierre Kabore, R. W.; Yeremon, O. G.; Mohamed, M. A.; He, Q.
Show abstract
This study evaluates the feasibility of implementing artificial intelligence (AI)-driven disease surveillance systems at Julius Nyerere International Airport (JNIA) in Tanzania, a key hub for regional and international travel. Through a mixed-methods approach combining qualitative interviews and quantitative surveys, the research assesses the infrastructure, human resource capacity, and regulatory frameworks necessary for AI integration. Findings indicate that while Port Health Officers are strongly optimistic about AI's potential to enhance disease detection, the airport faces significant barriers, including outdated infrastructure, insufficient technical resources, and a lack of trained personnel. Ethical and privacy concerns, particularly surrounding data security, also emerged as key challenges, compounded by limited public awareness and uncertain socio-cultural acceptability of AI systems. The study further identifies gaps in national policies and inter-agency coordination that hinder the effective implementation of AI technologies. It concludes that while current conditions render AI adoption infeasible, strategic investments in infrastructure, workforce training, and policy development could pave the way for future integration, enhancing public health surveillance at JNIA and potentially other airports in low- and middle-income countries. This study contributes critical insights into the barriers and opportunities for AI-driven disease surveillance in low-resource settings, focusing on a high-priority class of transit points: international airports. It emphasizes the importance of region-specific solutions for health security in East Africa and supports the broader global health agenda by advocating international collaboration and the development of scalable disease surveillance systems.
Future research should explore pilot AI implementations at other airports to evaluate real-world challenges and refine AI systems for broader applicability, including cost-effectiveness analyses and integration of public perspectives on AI.